# Tagalog Pretrained
## BERT Tagalog Base Uncased
A pretrained language model for Tagalog, trained on multi-source data and suitable for Tagalog natural language processing tasks.
Tags: Large Language Model, Transformers

Publisher: GKLMIP
## RoBERTa Tagalog Base
A pretrained language model for Tagalog, trained on multi-source data and aimed at improving performance on Tagalog natural language processing tasks.
Tags: Large Language Model, Transformers

Publisher: GKLMIP
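Both models ship in the Hugging Face Transformers format, so masked-word prediction works out of the box. A minimal sketch, assuming the repo id `GKLMIP/bert-tagalog-base-uncased` under the GKLMIP publisher named above (swap in `GKLMIP/roberta-tagalog-base` for the RoBERTa variant); it requires `pip install transformers torch`, and the first run downloads the weights:

```python
from transformers import pipeline

# Repo id assumed from the publisher (GKLMIP) shown in this listing.
fill = pipeline("fill-mask", model="GKLMIP/bert-tagalog-base-uncased")

# Mask one token in a Tagalog sentence and ask the model to fill it in.
masked = f"Mahal {fill.tokenizer.mask_token} ang pamilya ko."
preds = fill(masked)  # by default, the top 5 candidate tokens

for p in preds:
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

Each prediction is a dict with the candidate token (`token_str`), its probability (`score`), and the completed sentence (`sequence`), which makes it easy to compare the two models on the same masked input.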